Student-t Processes as Alternatives to Gaussian Processes

Authors

  • Amar Shah
  • Andrew Gordon Wilson
  • Zoubin Ghahramani
Abstract

We investigate the Student-t process as an alternative to the Gaussian process as a nonparametric prior over functions. We derive closed form expressions for the marginal likelihood and predictive distribution of a Student-t process, by integrating away an inverse Wishart process prior over the covariance kernel of a Gaussian process model. We show surprising equivalences between different hierarchical Gaussian process models leading to Student-t processes, and derive a new sampling scheme for the inverse Wishart process, which helps elucidate these equivalences. Overall, we show that a Student-t process can retain the attractive properties of a Gaussian process – a nonparametric representation, analytic marginal and predictive distributions, and easy model selection through covariance kernels – but has enhanced flexibility, and predictive covariances that, unlike a Gaussian process, explicitly depend on the values of training observations. We verify empirically that a Student-t process is especially useful in situations where there are changes in covariance structure, or in applications like Bayesian optimization, where accurate predictive covariances are critical for good performance. These advantages come at no additional computational cost over Gaussian processes.
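
The closed-form predictive mentioned in the abstract can be written down in a few lines. The sketch below is a minimal illustration rather than the authors' reference code: it assumes a zero mean function, a squared-exponential kernel, and the parameterisation TP(nu, 0, K) with nu > 2; names such as `tp_predict`, `nu`, and `jitter` are placeholders.

```python
# Minimal sketch of a Student-t process (TP) predictive, under the
# assumptions stated above. Not the authors' implementation.
import numpy as np


def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)


def tp_predict(X_train, y_train, X_test, nu=5.0,
               lengthscale=1.0, variance=1.0, jitter=1e-6):
    """Predictive mean, covariance and degrees of freedom of a TP.

    The predictive is again multivariate Student-t; unlike a GP, its
    covariance is rescaled by a data-dependent factor involving
    beta = y^T K^{-1} y, so it responds to the training targets.
    """
    n = len(y_train)
    K11 = rbf_kernel(X_train, X_train, lengthscale, variance) + jitter * np.eye(n)
    K12 = rbf_kernel(X_train, X_test, lengthscale, variance)
    K22 = rbf_kernel(X_test, X_test, lengthscale, variance)

    L = np.linalg.cholesky(K11)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))  # K11^{-1} y
    V = np.linalg.solve(L, K12)                                # L^{-1} K12

    mean = K12.T @ alpha                                       # same as the GP mean
    beta = float(y_train @ alpha)                              # y^T K11^{-1} y
    scale = (nu + beta - 2.0) / (nu + n - 2.0)                 # data-dependent rescaling
    cov = scale * (K22 - V.T @ V)                              # GP covariance, rescaled
    return mean, cov, nu + n                                   # predictive dof grows with n


# Toy usage: 8 noise-free observations of sin(6x) on [0, 1].
X = np.linspace(0.0, 1.0, 8)[:, None]
y = np.sin(6.0 * X).ravel()
Xs = np.linspace(0.0, 1.0, 50)[:, None]
mu, Sigma, dof = tp_predict(X, y, Xs, nu=5.0)
```

Fixing `scale = 1` (equivalently, letting nu go to infinity) recovers the usual GP predictive, which is consistent with the abstract's claim that the extra flexibility comes at no additional computational cost.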


Similar articles

Filtering Outliers in Bayesian Optimization

Jarno Vanhatalo, Pasi Jylänki, and Aki Vehtari. Gaussian process regression with Student-t likelihood. In NIPS, pages 1910–1918, 2009.
Amar Shah, Andrew Gordon Wilson, and Zoubin Ghahramani. Student-t processes as alternatives to Gaussian processes. In AISTATS, pages 877–885, 2014.
Anthony O'Hagan. On outlier rejection phenomena in Bayes inference. Journal of the Royal Statistical Society. Seri...


The Rate of Entropy for Gaussian Processes

In this paper, we show that the Tsallis entropy rate for stochastic processes can be obtained as the limit of conditional entropy, as was done for the Shannon and Rényi entropy rates. Using this, we obtain the Tsallis entropy rate for stationary Gaussian processes. Finally, we derive the relation between the Rényi, Shannon, and Tsallis entropy rates for stationary Gaussian proc...


Complete convergence of moving-average processes under negative dependence sub-Gaussian assumptions

Complete convergence is investigated for moving-average processes of a doubly infinite sequence of negatively dependent sub-Gaussian random variables with zero means, finite variances, and absolutely summable coefficients. As a corollary, the rate of complete convergence is obtained under suitable conditions on the coefficients.


Hypervolume-based Multi-objective Bayesian Optimization with Student-t Processes

Student-t processes have recently been proposed as an appealing alternative nonparametric function prior. They feature enhanced flexibility and predictive variance. In this work, the use of Student-t processes is explored for multi-objective Bayesian optimization. In particular, an analytical expression for the hypervolume-based probability of improvement is developed for independent Student-t ...
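
For context, the snippet below shows only the single-objective building block: probability of improvement when the surrogate's predictive marginal is Student-t rather than Gaussian. It is a hedged illustration, not the hypervolume-based multi-objective criterion developed in the work above; the names `student_t_pi`, `best_so_far`, and `xi` are placeholders.

```python
# Single-objective probability of improvement under a Student-t predictive
# marginal (minimisation). Illustrative only; not the hypervolume-based
# multi-objective acquisition function from the paper above.
from scipy import stats


def student_t_pi(mean, std, dof, best_so_far, xi=0.0):
    """P(f(x) < best_so_far - xi) when f(x) ~ mean + std * T_dof."""
    z = (best_so_far - xi - mean) / std
    return stats.t.cdf(z, df=dof)


# Example: a candidate with predictive mean 0.2, scale 0.5 and 7 degrees of
# freedom, compared against an incumbent best value of 0.0.
print(student_t_pi(mean=0.2, std=0.5, dof=7, best_so_far=0.0))
```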


Properties of Spatial Cox Process Models

Probabilistic properties of Cox processes of relevance for statistical modeling and inference are studied. In particular, we study the most important classes of Cox processes, including log Gaussian Cox processes, shot noise Cox processes, and permanent Cox processes. We consider moment properties and point process operations such as thinning, displacements, and superpositioning. We also discuss...



Journal:

Volume   Issue

Pages  -

Publication date: 2014